Generalized Measures of Divergence for Lifetime Distributions

Author

  • Ilia Vonta
Abstract

Measures of divergence or discrepancy are used either to measure the mutual information concerning two variables or to construct model selection criteria. In this paper we focus on divergence measures based on the class known as Csiszar's divergence measures. In particular, we propose a measure of divergence between the residual lives of two items that have both survived up to some time t, as well as a measure of divergence between their past lives, both based on Csiszar's class of measures. Furthermore, we derive properties of these measures and provide examples based on the Cox model and on frailty or transformation models.
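To make the construction concrete, a Csiszar φ-divergence between the residual lives of two items that have survived past time t can be approximated numerically. The sketch below is an illustration under assumed inputs, not the paper's implementation: it takes two exponential lifetimes with rates 1.0 and 2.0 (arbitrary choices) and the Kullback-Leibler member of Csiszar's class, φ(u) = u log u.

```python
import math

def residual_density(f, S, t):
    """Density of the residual life X - t given X > t: f(t + x) / S(t)."""
    St = S(t)
    return lambda x: f(t + x) / St

def csiszar_divergence(phi, f1, f2, upper=50.0, n=200000):
    """Midpoint-rule approximation of D_phi(f1 || f2) = ∫ f2(x) phi(f1(x)/f2(x)) dx."""
    h = upper / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        p, q = f1(x), f2(x)
        if q > 0.0:
            total += q * phi(p / q) * h
    return total

# Kullback-Leibler divergence corresponds to phi(u) = u log u.
def phi_kl(u):
    return u * math.log(u) if u > 0.0 else 0.0

# Two exponential lifetimes; the rates are illustrative choices.
lam1, lam2 = 1.0, 2.0
f1 = lambda x: lam1 * math.exp(-lam1 * x)
S1 = lambda t: math.exp(-lam1 * t)
f2 = lambda x: lam2 * math.exp(-lam2 * x)
S2 = lambda t: math.exp(-lam2 * t)

for t in (0.0, 1.0, 3.0):
    d = csiszar_divergence(phi_kl,
                           residual_density(f1, S1, t),
                           residual_density(f2, S2, t))
    print(f"t = {t}: KL between residual lives ≈ {d:.4f}")
```

Because the exponential distribution is memoryless, the residual-life divergence here stays constant in t (it matches the closed form log(λ1/λ2) + λ2/λ1 − 1); for an ageing distribution such as the Weibull it would vary with t.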


Similar resources

Dynamic Bayesian Information Measures

This paper introduces measures of information for Bayesian analysis when the support of the data distribution is truncated progressively. The focus is on lifetime distributions where the support is truncated at the current age t ≥ 0. Notions of uncertainty and information are presented and operationalized by Shannon entropy, Kullback-Leibler information, and mutual information. Dynamic updating...
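The dynamic (residual) Shannon entropy mentioned above can be sketched numerically: H(t) is the entropy of the remaining lifetime given survival past age t. The example below assumes a Weibull lifetime with shape 2 and scale 1 — illustrative values, not taken from the paper.

```python
import math

def residual_entropy(f, S, t, upper=50.0, n=200000):
    """Shannon entropy of the remaining lifetime given survival past t:
    H(t) = -∫_t^∞ (f(x)/S(t)) log(f(x)/S(t)) dx, via the midpoint rule."""
    St = S(t)
    h = (upper - t) / n
    total = 0.0
    for i in range(n):
        x = t + (i + 0.5) * h
        d = f(x) / St
        if d > 0.0:
            total -= d * math.log(d) * h
    return total

# Weibull(shape=2, scale=1) lifetime -- illustrative values only.
k, lam = 2.0, 1.0
f = lambda x: (k / lam) * (x / lam) ** (k - 1) * math.exp(-((x / lam) ** k))
S = lambda x: math.exp(-((x / lam) ** k))

for t in (0.0, 0.5, 1.0):
    print(f"H({t}) = {residual_entropy(f, S, t):.4f}")
```

A quick sanity check: for a unit-rate exponential lifetime, memorylessness makes H(t) = 1 for every age t, which the same routine reproduces.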


Comparing the Shape Parameters of Two Weibull Distributions Using Records: A Generalized Inference

The Weibull distribution is a widely applicable model for lifetime data. In inference about two Weibull distributions based on records, the shape parameters of the distributions are usually assumed equal. However, the literature offers no appropriate method for comparing the shape parameters, so comparing the shape parameters of two Weibull distributions is very important. I...


Information Measures via Copula Functions

In applications of differential geometry to problems of parametric inference, the notion of divergence is often used to measure the separation between two parametric densities. Among them, in this paper, we examine measures such as the Kullback-Leibler information, the J-divergence, the Hellinger distance, the -divergence, and so on. Properties and results related to distances between probability d...
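The measures named in this entry have closed forms for normal densities, which makes them easy to sanity-check numerically. The sketch below (an illustration with assumed parameters, two unit-variance normals with means 0 and 1) computes the Kullback-Leibler information, the symmetrised J-divergence, and the squared Hellinger distance by quadrature.

```python
import math

def normal_pdf(mu, sigma):
    c = 1.0 / (sigma * math.sqrt(2.0 * math.pi))
    return lambda x: c * math.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

def integrate(g, lo=-20.0, hi=20.0, n=200000):
    """Midpoint-rule quadrature on [lo, hi]."""
    h = (hi - lo) / n
    return sum(g(lo + (i + 0.5) * h) for i in range(n)) * h

def kl(p, q):
    """Kullback-Leibler information ∫ p log(p/q)."""
    return integrate(lambda x: p(x) * math.log(p(x) / q(x)) if p(x) > 0.0 else 0.0)

def j_divergence(p, q):
    """Jeffreys' J-divergence: symmetrised Kullback-Leibler."""
    return kl(p, q) + kl(q, p)

def hellinger_sq(p, q):
    """Squared Hellinger distance (1/2) ∫ (sqrt(p) - sqrt(q))^2."""
    return 0.5 * integrate(lambda x: (math.sqrt(p(x)) - math.sqrt(q(x))) ** 2)

p = normal_pdf(0.0, 1.0)
q = normal_pdf(1.0, 1.0)

print(f"KL(p||q) = {kl(p, q):.4f}")   # closed form: (mu1-mu2)^2 / (2 sigma^2)
print(f"J(p, q)  = {j_divergence(p, q):.4f}")
print(f"H^2(p,q) = {hellinger_sq(p, q):.4f}")
```

With these parameters the closed forms are KL = 0.5, J = 1, and H² = 1 − exp(−1/8), which the quadrature matches to several decimals.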


Inference for the Type-II Generalized Logistic Distribution with Progressive Hybrid Censoring

This article presents the analysis of Type-II progressively hybrid censored data when the lifetimes of the items follow the Type-II generalized logistic distribution. Maximum likelihood estimators (MLEs) are investigated for estimating the location and scale parameters. It is observed that the MLEs cannot be obtained in explicit form. We provide the approximate maximum likelihood...


F-divergence Is a Generalized Invariant Measure between Distributions

Finding measures (or features) invariant to the inevitable variations caused by non-linguistic factors (transformations) is a fundamental and important problem in speech recognition. Recently, Minematsu [1, 2] proved that the Bhattacharyya distance (BD) between two distributions is invariant under invertible transforms of the feature space, and developed an invariant structural representation of speech based ...
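The invariance result cited here is easy to check numerically: under an invertible transform y = g(x), both densities pick up the same Jacobian factor, which cancels inside ∫ sqrt(p·q). The sketch below is a small demonstration with assumed inputs (two unit-variance normals and the transform y = exp(x)), not the paper's construction.

```python
import math

def normal_pdf(mu, sigma):
    c = 1.0 / (sigma * math.sqrt(2.0 * math.pi))
    return lambda x: c * math.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

def bhattacharyya(p, q, lo, hi, n=200000):
    """BD(p, q) = -log ∫ sqrt(p(x) q(x)) dx, midpoint rule on [lo, hi]."""
    h = (hi - lo) / n
    s = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * h
        s += math.sqrt(p(x) * q(x)) * h
    return -math.log(s)

p = normal_pdf(0.0, 1.0)
q = normal_pdf(1.0, 1.0)

# Invertible transform y = exp(x); the transformed densities carry the Jacobian 1/y.
p_t = lambda y: p(math.log(y)) / y
q_t = lambda y: q(math.log(y)) / y

bd_x = bhattacharyya(p, q, -15.0, 15.0)
bd_y = bhattacharyya(p_t, q_t, 1e-9, 60.0)
print(f"BD before transform: {bd_x:.4f}")
print(f"BD after transform:  {bd_y:.4f}")
```

For equal-variance normals the closed form is BD = (μ1 − μ2)²/8 = 0.125 here, and the transformed (lognormal) pair reproduces the same value up to quadrature error.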



Journal title:

Publication year: 2009